1) HDP Download Link:

https://hortonworks.com/downloads/#sandbox

2) HDP Installation Guide Link:

https://hortonworks.com/tutorial/sandbox-deployment-and-install-guide/section/1/

3) Guide to reset admin password

https://hortonworks.com/tutorial/learning-the-ropes-of-the-hortonworks-sandbox/#setup-ambari-admin-password

4) JDK Location

JAVA_HOME="/usr/lib/jvm/java"

5) Add the following line to the file C:\Windows\System32\drivers\etc\hosts on the local machine

127.0.0.1       localhost sandbox.hortonworks.com sandbox-hdp.hortonworks.com sandbox-hdf.hortonworks.com
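
On a Linux/macOS machine the equivalent file is /etc/hosts. The helper below is this note's sketch (not part of HDP) for adding the entry without duplicating it; the hosts-file path is a parameter so it can be tried on a copy first:

```shell
# Sketch (this note's helper): idempotently add the sandbox aliases to a
# hosts file. Pass a copy of the file to rehearse the change safely.
add_sandbox_hosts_entry() {
  local hosts_file="$1"
  local entry='127.0.0.1       localhost sandbox.hortonworks.com sandbox-hdp.hortonworks.com sandbox-hdf.hortonworks.com'
  # Only append if the sandbox-hdp alias is not already present
  if ! grep -q 'sandbox-hdp\.hortonworks\.com' "$hosts_file"; then
    printf '%s\n' "$entry" >> "$hosts_file"
  fi
}
```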

6) HDP Sandbox Quick Link:

http://localhost:8888/splash2.html

7) HDP Heartbeat Lost

	7.1 ambari-server stop
	7.2 ambari-agent stop
	7.3 ambari-agent start
	7.4 ambari-server start
	7.5 If the heartbeat is still lost, increase server.startup.web.timeout (50 seconds by default) in /etc/ambari-server/conf/ambari.properties on the HDP sandbox, then restart ambari-server
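
The recovery steps above can be sketched as a script. bump_timeout is this note's helper, and the 120-second value is illustrative rather than a documented requirement:

```shell
# Helper (this note's, not Ambari's): set or add server.startup.web.timeout
# in an ambari.properties file. The path is a parameter so the edit can be
# rehearsed on a copy.
bump_timeout() {
  local props="$1" timeout="$2"
  if grep -q '^server\.startup\.web\.timeout=' "$props"; then
    sed -i "s/^server\.startup\.web\.timeout=.*/server.startup.web.timeout=${timeout}/" "$props"
  else
    printf 'server.startup.web.timeout=%s\n' "$timeout" >> "$props"
  fi
}

# On the sandbox (guarded so the sketch is harmless on other machines):
if command -v ambari-server >/dev/null 2>&1; then
  ambari-server stop
  ambari-agent stop
  ambari-agent start
  bump_timeout /etc/ambari-server/conf/ambari.properties 120  # 120 is illustrative
  ambari-server start
fi
```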
	
8) If you encounter the following error:

	-bash: ./install_check.sh: /bin/bash^M: bad interpreter: No such file or directory

   This happens when a script is edited with a Windows-based tool (which adds CRLF line endings) and then uploaded to the sandbox. Run the following to fix it, replacing <script-file-name> with the actual file name:

   ex -bsc '%!awk "{sub(/\r/,\"\")}1"' -cx <script-file-name>
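
To see the fix in action, the same carriage-return substitution can be run with plain awk (useful where ex is unavailable; dos2unix, if installed, does the same job):

```shell
# Reproduce the problem: a script saved with Windows CRLF line endings
printf '#!/bin/bash\r\necho hello\r\n' > install_check.sh

# Strip the carriage returns (the same substitution the ex one-liner runs)
awk '{sub(/\r/,"")}1' install_check.sh > install_check.sh.tmp \
  && mv install_check.sh.tmp install_check.sh

chmod +x install_check.sh   # the script now runs under /bin/bash
```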

9) StreamSets Data Collector Installation

	9.1 Installation Guide URL
	
		https://streamsets.com/documentation/datacollector/latest/help/index.html#Installation/Installing_the_DC.html#task_bt1_zcp_kq   

	9.2 Log in to the HDP sandbox shell at localhost:4200 and unpack the file

		cd /root/TrainingOnHDP/
		tar xvzf streamsets-datacollector-all-3.0.0.0.tgz

	9.3 Open sdc.properties at /root/TrainingOnHDP/streamsets-datacollector-3.0.0.0/etc and make the following port change

		http.port=16030
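
The same edit can be scripted. set_sdc_port is this note's helper (not part of StreamSets); on the sandbox the file is the sdc.properties path from step 9.3:

```shell
# Helper (this note's): rewrite the http.port line in an sdc.properties
# file. Parameterized so it can be tried on a copy first; on the sandbox:
#   set_sdc_port /root/TrainingOnHDP/streamsets-datacollector-3.0.0.0/etc/sdc.properties 16030
set_sdc_port() {
  local props="$1" port="$2"
  sed -i "s/^http\.port=.*/http.port=${port}/" "$props"
}
```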

	9.4 Run Data Collector

		streamsets-datacollector-3.0.0.0/bin/streamsets dc

	9.5 Browse to http://localhost:16030/

		The default username and password are “admin” and “admin”.
		
10) NiFi 1.2.0 Installation

	10.1 Installation Guide URL
		
		https://hortonworks.com/downloads
		
	10.2 Go to the Ambari console at http://localhost:8080 and stop the existing NiFi service (old version)

	10.3 Upload nifi-1.2.0.3.0.2.0-76-bin.tar.gz to /root/TrainingOnHDP/ on HDP sandbox

	10.4 Log in to the HDP sandbox shell at localhost:4200 and unpack the file

		cd /root/TrainingOnHDP/
		tar xvzf nifi-1.2.0.3.0.2.0-76-bin.tar.gz
		
	10.5 Open nifi.properties at /root/TrainingOnHDP/nifi-1.2.0.3.0.2.0-76/conf and make the following port change (was 8080)

		nifi.web.http.port=9090	
	
	10.6 Run NiFi
	
		/root/TrainingOnHDP/nifi-1.2.0.3.0.2.0-76/bin/nifi.sh start
		
	10.7 Browse to http://localhost:9090/nifi
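
NiFi can take a minute or two to come up. A small polling helper (this note's sketch, relying on bash's built-in /dev/tcp) confirms when a UI port is accepting connections; it works for the other service ports in these notes as well:

```shell
# Poll until a TCP port accepts connections, or give up after N tries
# (one second apart). Uses bash's /dev/tcp pseudo-device, so no extra
# tools are needed on the sandbox.
wait_for_port() {
  local host="$1" port="$2" tries="${3:-30}" i=0
  while [ "$i" -lt "$tries" ]; do
    # bash opens /dev/tcp/<host>/<port> as a TCP connection
    if (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# Example: wait_for_port localhost 9090 60 && echo "NiFi UI is up"
```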

11) Elasticsearch Installation
	
	11.1 Installation URL
		
		https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.1.1.tar.gz
	
	11.2 Upload elasticsearch-6.1.1.tar.gz to /root/TrainingOnHDP/ on HDP sandbox

	11.3 Log in to the HDP sandbox shell at localhost:4200 and unpack the file

		cd /root/TrainingOnHDP/
		tar xvzf elasticsearch-6.1.1.tar.gz
		
	11.4 Open elasticsearch.yml at /root/TrainingOnHDP/elasticsearch-6.1.1/config and confirm the HTTP port setting (9200 is the Elasticsearch default)

		http.port: 9200
			
	11.5 Start Elasticsearch (it will not run as root, so create a dedicated user first)

		useradd elastic
		passwd elastic
		su elastic
			
		/root/TrainingOnHDP/elasticsearch-6.1.1/bin/elasticsearch		
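
A pitfall with step 11.5: the elastic user usually cannot read anything under /root, so starting the binary from there fails. One workaround, sketched with this note's assumptions (the /opt destination and the relocate_es helper are illustrative, not prescribed):

```shell
# Copy the Elasticsearch install to a path the elastic user can read,
# then hand over ownership. relocate_es is this note's helper.
relocate_es() {
  local src="$1" dest="$2" owner="$3"
  cp -r "$src" "$dest"
  # chown only if the target user actually exists on this machine
  if id "$owner" >/dev/null 2>&1; then
    chown -R "$owner" "$dest"
  fi
}

# On the sandbox:
#   relocate_es /root/TrainingOnHDP/elasticsearch-6.1.1 /opt/elasticsearch-6.1.1 elastic
#   su elastic -c '/opt/elasticsearch-6.1.1/bin/elasticsearch'
```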
	
12) Kibana Installation	
		
	12.1 Installation URL
		
		https://artifacts.elastic.co/downloads/kibana/kibana-6.1.1-linux-x86_64.tar.gz
	
	12.2 Upload kibana-6.1.1-linux-x86_64.tar.gz to /root/TrainingOnHDP/ on HDP sandbox

	12.3 Log in to the HDP sandbox shell at localhost:4200 and unpack the file

		cd /root/TrainingOnHDP/
		tar xvzf kibana-6.1.1-linux-x86_64.tar.gz
			
	12.4 Open kibana.yml at /root/TrainingOnHDP/kibana-6.1.1-linux-x86_64 and make the following port change (was 5601)

		server.port: 8744
		server.host: "0.0.0.0"
			
	12.5 Start Kibana
		
		/root/TrainingOnHDP/kibana-6.1.1-linux-x86_64/bin/kibana
			
	12.6 Go to the Kibana console at http://localhost:8744

13) Installing the Elasticsearch Interpreter for Zeppelin

	cd /usr/hdp/current/zeppelin-server/bin
	./install-interpreter.sh --name elasticsearch
	ls -la /usr/hdp/current/zeppelin-server/interpreter/	
	Log in to Ambari and append the following to the interpreter list in the Zeppelin advanced configuration: ,org.apache.zeppelin.elasticsearch.ElasticsearchInterpreter
	Restart Zeppelin
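
For reference, the edit appends the interpreter class to Zeppelin's comma-separated interpreter list. The fragment below is a sketch; the exact property name in Ambari's advanced Zeppelin config can vary by HDP version, so treat it as an assumption:

```
# zeppelin.interpreters (zeppelin-site.xml / Ambari advanced config);
# the existing entries are elided here, only the appended class is shown
zeppelin.interpreters=...,org.apache.zeppelin.elasticsearch.ElasticsearchInterpreter
```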

14) Stanford CoreNLP Server Installation

	14.1 Installation and Setup: http://stanfordnlp.github.io/CoreNLP/corenlp-server.html
			
	14.2 Full Deployment:  http://nlp.stanford.edu/software/stanford-corenlp-full-2016-10-31.zip
			
	14.3 Upload stanford-corenlp-full-2016-10-31.zip to /root/TrainingOnHDP/ on HDP sandbox 
			
	14.4 Log in to the HDP sandbox shell at localhost:4200 and unpack the file

		cd /root/TrainingOnHDP/
		unzip stanford-corenlp-full-2016-10-31.zip
				
	14.5 Start CoreNLP Server
				
		cd /root/TrainingOnHDP/stanford-corenlp-full-2016-10-31	
				
		java -mx4g -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 15000	

	14.6 Test

		curl --data 'This is greatest test ever.' 'http://localhost:9000/?properties={%22annotators%22%3A%22sentiment%22%2C%22outputFormat%22%3A%22json%22}' -o -			
				
		wget --post-data 'This is the worst way to test sentiment ever.' 'localhost:9000/?properties={"annotators":"sentiment","outputFormat":"json"}' -O -
				
					
	